
    Taking Inspiration from Flying Insects to Navigate inside Buildings

    Flying insects can be seen as genuinely agile micro air vehicles fitted with smart sensors, yet remarkably parsimonious in their use of brain resources. They are able to navigate visually in unpredictable and GPS-denied environments. Understanding how such tiny animals work would help engineers to address several issues relating to drone miniaturization and navigation inside buildings. To turn a drone of ~1 kg into a robot, miniaturized conventional avionics can be employed; however, this comes at the cost of flight autonomy. To turn a drone with a mass between ~1 g (or less) and ~500 g into a robot, by contrast, requires an innovative approach that takes inspiration from flying insects, both with regard to their flapping-wing propulsion system and to their sensory system, which is based mainly on motion vision, in order to avoid obstacles in three dimensions or to navigate on the basis of visual cues. This chapter provides a snapshot of the current state of the art in the field of bioinspired optic flow sensors and optic flow-based direct feedback loops applied to micro air vehicles flying inside buildings.
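The central quantity behind these optic flow sensors can be captured in a few lines. The sketch below is ours, not the chapter's; names and numbers are illustrative.

```python
# Illustrative sketch: the translational optic flow seen by a
# downward-looking motion sensor is the ratio of groundspeed to height,
# omega = v / h, in rad/s. This single scalar is what a bioinspired
# optic flow sensor delivers, with no access to v or h individually --
# which is why insect-style control loops can work without speed or
# range instruments.

def ventral_optic_flow(groundspeed_m_s, height_m):
    """Optic flow magnitude (rad/s) for pure forward translation."""
    if height_m <= 0.0:
        raise ValueError("height must be positive")
    return groundspeed_m_s / height_m
```

A vehicle at 2 m/s and 1 m height sees 2 rad/s of ventral optic flow; halving the height, or doubling the speed, doubles that reading.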

    Optic Flow Based Autopilots: Speed Control and Obstacle Avoidance

    The explicit control schemes presented here explain how insects may navigate on the sole basis of optic flow (OF) cues, without requiring any distance or speed measurements: how they take off and land, follow the terrain, avoid the lateral walls in a corridor, and control their forward speed automatically. The optic flow regulator, a feedback system controlling either the lift, the forward thrust, or the lateral thrust, is described. Three OF regulators account for various insect flight patterns observed over the ground and over still water, under calm and windy conditions, and in straight and tapered corridors. These control schemes were simulated experimentally and/or implemented onboard two types of aerial robots, a micro helicopter (MH) and a hovercraft (HO), which behaved much like insects when placed in similar environments. These robots were equipped with opto-electronic OF sensors inspired by our electrophysiological findings on houseflies' motion-sensitive visual neurons. The simple, parsimonious control schemes described here require no conventional avionic devices such as range finders, groundspeed sensors, or GPS receivers. They are consistent with the neural repertoire of flying insects and meet the low avionic payload requirements of autonomous micro aerial and space vehicles.
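The feedback principle described above can be sketched as a toy proportional loop. This is an assumption-laden illustration of the idea, not the authors' implementation; the gain, time step, and height-scaling of the climb rate are ours.

```python
# Minimal sketch of an optic flow regulator: a proportional feedback
# loop adjusts the climb rate so that the measured ventral optic flow
# (v / h) tracks a fixed set-point. The controller itself never sees
# v or h separately -- only their ratio.

def of_regulator_step(v, h, of_setpoint, gain=0.5, dt=0.01):
    """One control step; returns the new height of flight."""
    of_measured = v / h                 # what the eye actually delivers
    error = of_setpoint - of_measured
    climb_rate = -gain * error * h      # OF too low -> descend; too high -> climb
    return h + climb_rate * dt

h = 2.0
for _ in range(2000):                   # 20 simulated seconds at 1 m/s
    h = of_regulator_step(v=1.0, h=h, of_setpoint=1.0)
```

At 1 m/s with a 1 rad/s set-point, the height converges to 1 m; flying faster would make the same regulator level off proportionally higher, with no height sensor anywhere in the loop.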

    Optic flow based autopilot: From insects to rotorcraft and back

    When insects fly forwards, the image of the ground sweeps backwards across their ventral viewfield, forming an "optic flow" that depends on both the groundspeed and the height of flight. To explain how these animals manage to avoid the ground using this image motion cue, we suggest that insect navigation hinges on a visual feedback loop we have called the optic flow regulator, which controls the vertical lift. To test this idea, we built a micro-helicopter equipped with a fly-inspired optic flow sensor and an optic flow regulator. We showed that this fly-by-sight microrobot can perform exacting tasks such as takeoff, level flight, and landing. Our control scheme accounts for many hitherto unexplained findings published during the last 70 years on insects' visually guided performances, including the facts that honeybees descend under headwind conditions, land with a constant slope, and drown when travelling over mirror-smooth water. Our control scheme explains how insects manage to fly safely without any of the instruments used onboard aircraft to measure the height of flight, the airspeed, the groundspeed, and the descent speed. An optic flow regulator could easily be implemented neurally. It is just as appropriate for insects (1) as it would be for aircraft (2,3).
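The constant-slope landing mentioned above follows from simple arithmetic. The numbers below are assumed for illustration, not taken from the paper.

```python
# If the regulator holds the ventral optic flow omega = v / h at a
# constant set-point while the forward speed is ramped down, the height
# is forced to track h = v / omega: height decreases in proportion to
# speed, which traces a straight, constant-slope descent to touchdown.

OMEGA_SET = 2.0                          # rad/s, arbitrary set-point

def regulated_height(v):
    """Height imposed by holding v / h equal to OMEGA_SET."""
    return v / OMEGA_SET

speeds = [2.0, 1.5, 1.0, 0.5]
heights = [regulated_height(v) for v in speeds]
```

As the speed steps down 2.0, 1.5, 1.0, 0.5 m/s, the regulated height steps down 1.0, 0.75, 0.5, 0.25 m: a fixed height-to-speed ratio, i.e. a constant landing slope.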

    Biomimetic visual navigation in a corridor: to centre or not to centre?

    As a first step toward an Automatic Flight Control System (AFCS) for Micro-Air Vehicle (MAV) obstacle avoidance, we introduce a vision-based autopilot (LORA: Lateral Optic flow Regulation Autopilot), which is able to make a hovercraft automatically follow a wall or centre between the two walls of a corridor. A hovercraft is endowed with natural stabilization in pitch and roll while keeping two translational degrees of freedom (X and Y) and one rotational degree of freedom (yaw Ψ). We show the feasibility of an OF regulator that maintains the lateral Optic Flow (OF) on one wall equal to an OF set-point. The OF sensors used are Elementary Motion Detectors (EMDs), whose principle was directly inspired by the housefly's motion-detecting neurons. The properties of these neurons were previously analysed at our laboratory by performing electrophysiological recordings while applying optical microstimuli to single photoreceptor cells of the compound eye. The simulation results show that, depending on the OF set-point, the hovercraft either centres along the midline of the corridor or follows one of the two walls, even with a local lack of optical texture on one wall, such as that caused, for instance, by an open door or a T-junction. All these navigational tasks are performed with one and the same feedback loop, a lateral OF regulation loop, which permits relatively high-speed navigation (1 m/s, i.e., 3 body lengths per second). The passive visual sensors and the simple processing system are suitable for use on MAVs with an avionic payload of only a few grams. The goal is to achieve MAV automatic guidance or to relieve a remote operator from guiding it in challenging environments such as urban canyons or indoor environments.
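The single-loop idea behind LORA can be sketched as follows. This is a hypothetical toy, not the published controller: the corridor width, gain, time step, and kinematics are all assumptions.

```python
# Lateral optic flow regulation sketch: steer the sway position y so
# that the larger of the two lateral optic flows (v divided by the
# distance to each wall) matches an OF set-point.

def lora_step(y, v, width, of_setpoint, gain=0.05, dt=0.01):
    """One sway-control step in a corridor spanning [0, width]."""
    of_left, of_right = v / y, v / (width - y)
    if of_left >= of_right:               # left wall is the nearer one
        y += gain * (of_left - of_setpoint) * dt    # excess OF: move right
    else:                                 # right wall is the nearer one
        y -= gain * (of_right - of_setpoint) * dt   # excess OF: move left
    return y

y = 0.3
for _ in range(20000):
    y = lora_step(y, v=1.0, width=1.0, of_setpoint=2.0)
```

With the set-point equal to v / (width / 2), the equilibrium is the corridor midline (y = 0.5): the centring behaviour. A higher set-point, e.g. 4 rad/s, would instead park the craft 0.25 m from one wall, i.e. wall-following, using exactly the same loop.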

    Regulating the lateral optic flow to navigate along a corridor

    As a first step toward an Automatic Flight Control System (AFCS) for Micro-Air Vehicle (MAV) obstacle avoidance, we introduce a vision-based autopilot (LORA: Lateral Optic flow Regulation Autopilot), which is able to make a hovercraft automatically follow a wall or centre between the two walls of a corridor. A hovercraft is endowed with natural stabilization in pitch and roll while keeping two translational degrees of freedom (X and Y) and one rotational degree of freedom (yaw). We show the feasibility of an OF regulator that maintains the lateral Optic Flow (OF) on one wall equal to an OF set-point. The OF sensors used are Elementary Motion Detectors (EMDs), whose principle was directly inspired by the housefly's motion-detecting neurons. The properties of these neurons were previously analysed at our laboratory by performing electrophysiological recordings while applying optical microstimuli to single photoreceptor cells of the compound eye. The simulation results show that, depending on the OF set-point, the hovercraft either centres along the midline of the corridor or follows one of the two walls, even with a local lack of optical texture on one wall, such as that caused, for instance, by an open door or a T-junction. All these navigational tasks are performed with one and the same feedback loop, a lateral OF regulation loop, which permits relatively high-speed navigation (1 m/s, i.e., 3 body lengths per second) with a minimalist visual system (only two EMDs, each of which uses two pixels). This principle contrasts with the formerly proposed strategy, which consists in equalizing the two lateral OFs. The passive visual sensors and the simple processing system are suitable for use on MAVs with an avionic payload of only a few grams. The goal is to achieve MAV automatic guidance or to relieve a remote operator from guiding it in challenging environments such as urban canyons or indoor environments.
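The two-pixel EMD mentioned above can be sketched as a simplified delay-and-correlate (Reichardt-type) detector. The one-sample delay and the sinusoidal test stimulus are our assumptions, not the hardware described in the paper.

```python
import math

# Two-pixel Elementary Motion Detector sketch: receptor A's delayed
# signal is correlated with receptor B's current signal, minus the
# mirror-image term, yielding an output whose sign encodes the
# direction of image motion across the two pixels.

def emd_response(sig_a, sig_b, delay=1):
    """Mean opponent correlation between two photoreceptor signals."""
    total = sum(sig_a[t - delay] * sig_b[t] - sig_b[t - delay] * sig_a[t]
                for t in range(delay, len(sig_a)))
    return total / (len(sig_a) - delay)

a = [math.sin(0.4 * t) for t in range(200)]
b_fwd = [math.sin(0.4 * (t - 1)) for t in range(200)]  # pattern reaches B after A
b_bwd = [math.sin(0.4 * (t + 1)) for t in range(200)]  # pattern reaches B before A
```

The output is positive for A-to-B motion and negative for the reverse; two such two-pixel pairs, one facing each wall, are enough to feed the lateral OF regulation loop.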

    Flying in 3D with an Insect based Visual Autopilot

    Flying insects rely on Optic Flow (OF) cues to avoid collisions, control their speed, control their height, and land. Recent studies have shown that the principle of "OF regulation" may account for various behaviours observed in freely flying insects. The aim of the present study was to propose a visually guided autopilot enabling an insect to navigate in 3D, and to test its robustness to natural images. Using computer-simulation experiments, we simulated a bee that flies through a tunnel wallpapered with natural images, controlling both its ground speed and its clearance from all four sides: the lateral walls, the ground, and the ceiling. The simulated bee can translate along three directions (the surge, sway, and heave axes): it is therefore fully actuated. The new visuo-motor control system, called ALIS (AutopiLot using an Insect-based vision System), is a dual OF regulator consisting of two interdependent feedback loops: the speed control loop (along the surge axis) and the positioning control loop (along both the sway and heave axes), each of which has its own OF set-point. The experiments show that the simulated bee navigates safely along a straight tunnel while compensating for the major OF perturbations caused by, e.g., a tapering of the tunnel or the lack of texture on one wall. The minimalistic visual system used here (only eight pixels) is robust to naturally contrasted stimuli and tunnels, and is sufficient to control both the clearance from the four sides and the forward speed jointly, without any need to measure speeds or distances. Moreover, the ALIS autopilot accounts remarkably well for the quantitative results of ethological experiments performed on honeybees flying freely in straight or tapered corridors.
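The dual-regulator architecture can be sketched as a toy in the spirit of ALIS. All gains, set-points, tunnel dimensions, and the simplified first-order kinematics below are our assumptions for illustration, not the published values: one loop servoes the forward speed on the sum of the two lateral optic flows, while the positioning loops servo the sway and heave on the dominant (nearer-surface) optic flow of each axis.

```python
def alis_step(state, tunnel=(1.0, 1.0), sp_fwd=4.0, sp_pos=2.5,
              g_v=0.02, g_p=0.02, dt=0.01):
    """One control step; state = (forward speed v, sway y, heave z)."""
    v, y, z = state
    width, height = tunnel
    of_l, of_r = v / y, v / (width - y)    # lateral optic flows (rad/s)
    of_g, of_c = v / z, v / (height - z)   # ventral and dorsal optic flows
    # Speed loop: the sum of the lateral optic flows tracks sp_fwd.
    v += g_v * (sp_fwd - (of_l + of_r)) * dt
    # Positioning loops: the dominant optic flow on each axis tracks sp_pos.
    y += g_p * ((of_l - sp_pos) if of_l >= of_r else -(of_r - sp_pos)) * dt
    z += g_p * ((of_g - sp_pos) if of_g >= of_c else -(of_c - sp_pos)) * dt
    return v, y, z

state = (0.5, 0.3, 0.7)
for _ in range(60000):                     # 600 simulated seconds
    state = alis_step(state)
```

With these assumed set-points the agent settles near v ≈ 0.94 m/s with a clearance of v / sp_pos ≈ 0.375 m from the nearest wall and from the ceiling. Because the equilibrium speed scales with the tunnel width, the same loops make the agent slow down in a tapering section, the behaviour reported for honeybees.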

    3D Navigation With An Insect-Inspired Autopilot

    ISBN: 978-2-9532965-0-1. Using computer-simulation experiments, we developed a vision-based autopilot that enables a "simulated bee" to travel along a tunnel by controlling both its speed and its clearance from the right wall, the left wall, the ground, and the ceiling. The flying agent can translate along three directions (surge, sway, and heave): the agent is therefore fully actuated. The visuo-motor control system, called ALIS (AutopiLot using an Insect-based vision System), is a dual OF regulator consisting of two interdependent feedback loops, each of which has its own OF set-point. The experiments show that the simulated bee navigates safely along a straight tunnel while reacting sensibly to the major OF perturbation caused by the presence of a tapered tunnel. The visual system is minimalistic (only eight pixels), yet it suffices to control the clearance from the four walls and the forward speed jointly, without the need to measure any speeds or distances. The OF sensors and the simple visuo-motor control system developed here are suitable for use on MAVs with avionic payloads as small as a few grams. Moreover, the ALIS autopilot accounts remarkably well for the quantitative results of ethological experiments performed on honeybees flying freely in straight or tapered corridors.

    Event-based visual guidance inspired by honeybees in a 3D tapered tunnel

    In view of neuro-ethological findings on honeybees and our previously developed vision-based autopilot, in-silico experiments were performed in which a "simulated bee" was made to travel along a doubly tapering tunnel, including event-based controllers for the first time. The "simulated bee" was equipped with:
    • a minimalistic compound eye comprising 10 local motion sensors measuring the optic flow magnitude,
    • two optic flow regulators updating the control signals whenever specific optic flow criteria changed,
    • and three event-based controllers taking into account the error signals, each one in charge of its own translational dynamics.
    A MORSE/Blender-based simulator engine delivered what each of the 20 "simulated photoreceptors" saw in the tunnel, which was lined with high-resolution natural 2D images. The "simulated bee" managed to travel safely along the doubly tapering tunnel without requiring any speed or distance measurements, using only a Gibsonian point of view, by:
    • concomitantly adjusting the side thrust, vertical lift, and forward thrust whenever a change was detected in the optic flow-based error signals,
    • and avoiding collisions with the surfaces of the doubly tapering tunnel while decreasing or increasing its speed, depending on the clutter rate perceived by the motion sensors.
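The event-based idea, updating the control signal only when an optic flow criterion changes, can be sketched as follows. The trigger threshold, gain, and error trace are our assumptions, not values from the study.

```python
# Event-based controller sketch: rather than recomputing the command at
# every tick, the controller updates it only when the optic-flow error
# has changed by more than a trigger threshold since the last event,
# and holds the previous command otherwise.

def make_event_controller(gain=0.5, trigger=0.05):
    state = {"last_error": None, "u": 0.0, "events": 0}
    def step(error):
        changed = (state["last_error"] is None
                   or abs(error - state["last_error"]) > trigger)
        if changed:                        # an event: recompute the command
            state["u"] = gain * error
            state["last_error"] = error
            state["events"] += 1
        return state["u"]                  # between events: hold the command
    return step, state

ctrl, st = make_event_controller()
errors = [0.40, 0.39, 0.38, 0.20, 0.19, 0.18, 0.02]
outputs = [ctrl(e) for e in errors]
```

Only three updates fire on this error trace (at 0.40, 0.20, and 0.02); the command is held constant in between, which is the computational saving that event-based control aims at.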

    A bee in the corridor: centering and wall-following

    In an attempt to better understand the mechanism underlying lateral collision avoidance in flying insects, we trained honeybees (Apis mellifera) to fly through a large (95 cm-wide) flight tunnel. We found that depending on the entrance and feeder positions, honeybees would either center along the corridor midline or fly along one wall. Bees kept following one wall even when a major (150 cm-long) part of the opposite wall was removed. These findings cannot be accounted for by the 'optic flow balance' hypothesis that has been put forward to explain the bees' typical 'centering response' observed in narrower corridors. Both centering and wall-following behaviours are well accounted for, however, by a mechanism called the lateral optic flow regulator, i.e., a feedback system that strives to maintain the unilateral optic flow constant. The power of this mechanism is that it would allow the bee to guide itself visually in a corridor without having to measure its speed or its distance from the walls.